MIT: New Method Uses ML to Accelerate Data Retrieval in Large Databases - High-Performance Computing News Analysis

#artificialintelligence

CAMBRIDGE, MA -- March 14, 2023 -- Researchers from MIT and other institutions report that a "hash function" -- a core database search operation -- can be significantly accelerated through the use of machine learning. The hope is that the new technique could speed up the computational systems scientists use to store and analyze DNA, amino acid sequences, and other biological information. Hashing is used in applications from database indexing to data compression to cryptography. A hash function generates codes that directly determine the location where data will be stored. But because traditional hash functions generate codes randomly, two pieces of data can sometimes hash to the same value -- a collision.
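The contrast can be sketched in a few lines. This is not MIT's implementation, only an illustration of the underlying idea: a traditional hash function scatters keys pseudo-randomly, while a "learned" hash exploits the data's distribution (here approximated by rank in a sorted sample, a stand-in for the learned model) to spread keys more evenly across buckets. The function names and the numeric-key assumption are mine.

```python
import bisect

def traditional_hash(key: str, num_buckets: int) -> int:
    # A conventional hash scatters keys pseudo-randomly across buckets;
    # distinct keys can land in the same bucket (a collision).
    return hash(key) % num_buckets

def learned_hash(key: float, keys_sorted: list, num_buckets: int) -> int:
    # Stand-in for a learned model: the key's rank in a sorted sample
    # approximates its CDF, mapping keys near-uniformly onto buckets.
    rank = bisect.bisect_left(keys_sorted, key)
    return min(num_buckets - 1, rank * num_buckets // len(keys_sorted))
```

If the model of the data distribution is accurate, the learned mapping fills buckets evenly and collisions drop; if the data shifts away from the modeled distribution, it degrades toward the random case.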


Cognigy Joins AWS Independent Software Vendor Accelerate Program - High-Performance Computing News Analysis


Cognigy's acceptance into the AWS ISV Accelerate Program enables the company to collaborate more closely with the AWS organization to design and deliver superior outcomes to AWS customers leveraging the combined solution set of AWS and Cognigy. "Achieving the AWS ISV Accelerate membership so closely after earning our AWS Conversational AI Competency distinction is a clear validation of the power of our combined solutions to transform customer service," said Hardy Myers, SVP business development and strategy. "We are excited to see our relationship with AWS continue to grow. It is truly an honor to collaborate with such a talented team of enterprise experts at AWS to build the next generation of AI-powered customer service solutions." Cognigy is an official Amazon partner and one of the first companies certified by AWS for its Conversational AI competency.


How Machine Learning Is Revolutionizing HPC Simulations - High-Performance Computing News Analysis


Physics-based simulations, that staple of traditional HPC, may be evolving toward an emerging, AI-based technique that could radically accelerate simulation runs while cutting costs. Called "surrogate machine learning models," the technique was a focal point in a keynote on Tuesday at the International Conference on Parallel Processing by Argonne National Lab's Rick Stevens. Stevens, ANL's associate laboratory director for computing, environment and life sciences, said early work in "surrogates," as the technique is called, shows speed-ups of tens of thousands of times (and more) and could "potentially replace simulations." In his keynote, "Exascale and Then What?: The Next Decade for HPC and AI," Stevens explained surrogates this way: "You have a system, it could be a molecular system or drug design…, and you have a physics-based simulation of it… You run this code and capture the input-output relationships of the core simulation… You use that training data to build an approximate model. These are typically done with neural networks… and this surrogate model approximates the simulation, and typically it is much faster. Of course, it has some errors, so then you use that surrogate model to search the space, or to advance time steps. And then maybe you do a correction step later."
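The workflow Stevens describes can be sketched end to end. This is a toy illustration under my own assumptions, not Argonne's code: the "expensive" simulation is a one-line analytic function, and the surrogate is plain piecewise-linear interpolation rather than the neural network a real surrogate would typically use, to keep the sketch dependency-free. The steps mirror the quote: capture input-output pairs, fit an approximate model, search the space with the cheap model, then correct with the real one.

```python
import math

def simulate(x: float) -> float:
    # Hypothetical "expensive" physics simulation (stand-in for a real
    # HPC code that might take hours per run).
    return math.sin(x) * math.exp(-0.1 * x)

# Step 1: run the real simulation to capture input-output pairs.
train_x = [i * 0.5 for i in range(20)]
train_y = [simulate(x) for x in train_x]

def surrogate(x: float) -> float:
    # Step 2: approximate model built from the training data
    # (piecewise-linear interpolation; real surrogates are typically
    # neural networks).
    for (x0, y0), (x1, y1) in zip(zip(train_x, train_y),
                                  zip(train_x[1:], train_y[1:])):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return train_y[-1]

# Step 3: use the cheap surrogate to search the input space...
best = min((i * 0.01 for i in range(950)), key=surrogate)
# ...then do a correction step with the real simulation at the candidate.
corrected = simulate(best)
```

The economics come from the asymmetry: the surrogate is evaluated hundreds of times in the search but costs almost nothing, while the real simulation runs only twenty times for training data and once more for the correction step.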


The Metaverse and the AI-Driven Compute Resources That Will Make It Happen - High-Performance Computing News Analysis


Truly understanding the metaverse requires some imagination. The digital world makes so many things possible – from exact digital copies and simulations of real-world environments to digital worlds based on far-ranging and inventive creativity. Tech titans like NVIDIA and Meta are taking the lead in defining these new worlds, such as creating digital twins with NVIDIA's OVX platform, or totally new worlds integrated into Meta's Horizon concept. We can only guess what the future holds for these new digital worlds and how they will integrate with our own reality. No doubt the maturing metaverse will require tremendous compute resources.